Stabilize the OSM reverse proxy #3

Merged
cyrossignol merged 26 commits into main from stabilize-proxy on Mar 7, 2026
Conversation


@cyrossignol cyrossignol commented Mar 5, 2026

This is a collection of changes that fixes the configuration, performance, and security issues needed to get the OSM reverse proxy running reliably.

Please see the individual commits for details.

Summary by CodeRabbit

  • New Features

    • Added /api/capabilities.json and enabled CORS.
    • Enhanced authentication: validate JWTs against external provider and fetch project-group roles.
    • Introduced shared HTTP clients with app lifecycle management.
  • Bug Fixes

    • More resilient proxying with streaming, broader header forwarding, upstream error handling, and logging.
    • Clearer 400/403 responses for malformed workspace headers and stronger workspace-access checks.
    • Safer workspace ID handling to reduce injection risk.
  • Chores

    • Updated default proxy host configuration.

@cyrossignol cyrossignol requested a review from jeffmaki on March 5, 2026 at 18:33

coderabbitai bot commented Mar 5, 2026

📝 Walkthrough

Updates the default OSM host, adds JWKS-based JWT validation, refactors token validation to call the TDEI backend asynchronously, introduces shared async HTTP clients and CORS middleware, and replaces the proxy with a streaming, header-forwarding implementation plus a /api/capabilities.json endpoint.

Changes

  • Configuration (api/core/config.py): Changed the default WS_OSM_HOST from "http://osm-rails:3000" to "http://osm-web" and added an exported CORS_ORIGINS: list[str] = [].
  • Authentication / JWT (api/core/jwt.py): New module with a singleton PyJWKClient and validate_and_decode_token(token: str) -> dict that validates RS256 JWTs against the OIDC JWKS endpoint.
  • Security / Token Validation (api/core/security.py): Added a shared TDEI async HTTP client lifecycle (init_tdei_client, close_tdei_client); replaced inline JWT decoding with validate_and_decode_token; changed the validation flow to call TDEI for project-group memberships and construct UserInfo; added guards and casting for workspace IDs.
  • API Routing & Proxy (api/main.py): Added an app lifespan to init/close the shared async clients, CORS middleware, a /api/capabilities.json endpoint, and a streaming catch-all proxy that forwards non-hop-by-hop headers, validates X-Workspace, handles timeouts and connection errors, and logs upstream errors.
  • Data Access (api/src/workspaces/repository.py): Interpolated workspace_id as int(workspace_id) in SET search_path to avoid injection risk when not invoked via the FastAPI path handler; added an explanatory comment.
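
The int-cast pattern from the repository change above can be sketched like this. The "workspace_" schema-name prefix is an assumption for illustration; the actual repository may name schemas differently.

```python
def set_search_path_sql(workspace_id) -> str:
    # int() raises ValueError for anything that is not a plain integer, so the
    # value interpolated into the SQL below cannot carry injection payloads
    # even when this code is reached without FastAPI's path-parameter parsing.
    safe_id = int(workspace_id)
    return f"SET search_path TO workspace_{safe_id}, public"
```

The cast is the whole defense: a string like "42; DROP TABLE users" never reaches the SQL text because int() rejects it first.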

Sequence Diagrams

sequenceDiagram
    participant Client
    participant API
    participant JWKS
    participant TDEI
    Client->>API: Request with Bearer token
    API->>JWKS: Fetch signing key from OIDC certs URL (PyJWKClient)
    JWKS-->>API: Return public key
    API->>API: Decode/validate JWT -> payload
    API->>TDEI: GET project-group memberships (Authorization)
    TDEI-->>API: Return roles / project groups
    API->>API: Build AccessibleWorkspaceIds and UserInfo
    API-->>Client: Return validated UserInfo / proceed with request
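
The JWKS step in the token-validation flow above can be sketched with a lazy singleton. JWKSClient is a stand-in for PyJWT's jwt.PyJWKClient, and the OIDC URL and realm values below are placeholders for the corresponding settings; the rstrip("/") normalization also addresses the trailing-slash concern raised in the review comments.

```python
from functools import lru_cache

class JWKSClient:
    """Stand-in for jwt.PyJWKClient; here it only records the JWKS endpoint URL."""
    def __init__(self, url: str):
        self.url = url

# Placeholders standing in for settings.TDEI_OIDC_URL / settings.TDEI_OIDC_REALM.
OIDC_URL = "https://oidc.example.com/"
OIDC_REALM = "tdei"

@lru_cache(maxsize=1)
def get_jwks_client() -> JWKSClient:
    # lru_cache turns this into a lazy singleton, so the real client's JWKS
    # key cache survives across requests instead of being rebuilt per token.
    base = OIDC_URL.rstrip("/")  # tolerate a trailing slash in configuration
    return JWKSClient(f"{base}/realms/{OIDC_REALM}/protocol/openid-connect/certs")
```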
sequenceDiagram
    participant Client
    participant API
    participant OSM
    Client->>API: Request (path, headers, X-Workspace, body)
    API->>API: Parse/validate X-Workspace, filter hop-by-hop headers
    API->>OSM: Stream request via shared httpx.AsyncClient
    OSM-->>API: Stream response
    API->>API: Filter response headers (exclude hop-by-hop)
    API-->>Client: Stream response back
    Note over API: Handle timeouts, connection errors, and log upstream errors
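
The header-filtering step in the proxy flow above can be expressed as a pure function. The set below includes "host", per the review feedback further down, so the HTTP client can derive Host from the upstream URL; the function name is illustrative rather than taken from api/main.py.

```python
# RFC 9110 hop-by-hop headers, plus "host" so the upstream request's Host is
# derived from the target URL instead of copied from the incoming request.
HOP_BY_HOP_HEADERS = frozenset(
    [
        "host",
        "connection",
        "keep-alive",
        "proxy-authenticate",
        "proxy-authorization",
        "te",
        "trailers",
        "transfer-encoding",
        "upgrade",
    ]
)

def forwardable_headers(headers: dict[str, str]) -> dict[str, str]:
    # Lowercase the key first so "Host", "Connection", etc. are caught
    # regardless of how the client capitalized them.
    return {k: v for k, v in headers.items() if k.lower() not in HOP_BY_HOP_HEADERS}
```

The same filter applies in both directions: once on the incoming request before building the upstream request, and once on the upstream response before streaming it back.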

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~45 minutes

Suggested reviewers

  • jeffmaki

Poem

🐰 Hoppy keys and streaming streams,
I nibble headers, chase the beams.
Tokens gleam like carrot light,
Workspaces hum through day and night.
🥕✨

🚥 Pre-merge checks | ✅ 2 | ❌ 1

❌ Failed checks (1 warning)

  • Docstring Coverage: ⚠️ Warning. Docstring coverage is 30.77%, below the required threshold of 80.00%. Resolution: write docstrings for the functions that are missing them.
✅ Passed checks (2 passed)
  • Description Check: ✅ Passed. Check skipped because CodeRabbit’s high-level summary is enabled.
  • Title check: ✅ Passed. The title "Stabilize the OSM reverse proxy" accurately summarizes the main objective of this pull request: fixing configuration, performance, and security issues in the OSM reverse proxy across multiple modules.

✏️ Tip: You can configure your own custom pre-merge checks in the settings.



@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 4

🧹 Nitpick comments (2)
api/main.py (1)

136-147: Consider: Use raise ... from None for cleaner exception chaining.

The static analyzer flags that re-raising exceptions without from can make tracebacks confusing. Since you're intentionally raising a new HTTPException, using from None explicitly suppresses the original exception chain:

Optional improvement
     except httpx.TimeoutException:
         raise HTTPException(
             status_code=status.HTTP_504_GATEWAY_TIMEOUT,
             detail="Upstream OSM service timed out",
-        )
+        ) from None
     except httpx.ConnectError:
         raise HTTPException(
             status_code=status.HTTP_502_BAD_GATEWAY,
             detail="Could not connect to upstream OSM service",
-        )
+        ) from None
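
To illustrate what "from None" changes, here is a small self-contained example with hypothetical names; it mirrors the pattern in the diff above without depending on httpx or FastAPI.

```python
def parse_port(value: str) -> int:
    try:
        return int(value)
    except ValueError:
        # "from None" suppresses the implicit "During handling of the above
        # exception, another exception occurred" chain, so the traceback shows
        # only the deliberate domain-level error, not the internal ValueError.
        raise RuntimeError(f"invalid port: {value!r}") from None
```

Without "from None", Python prints both tracebacks joined by the "during handling" banner, which can mislead readers into thinking the second raise was accidental.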
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@api/main.py` around lines 136 - 147, Update the exception re-raises in the
try/except around _osm_client.send(rp_req, stream=True) to suppress the original
exception chain by using "raise HTTPException(...) from None" for both the
httpx.TimeoutException and httpx.ConnectError handlers; locate the block
handling rp_resp and replace the current raises so the new HTTPException
(status.HTTP_504_GATEWAY_TIMEOUT and status.HTTP_502_BAD_GATEWAY respectively)
are raised with "from None" to produce cleaner tracebacks.
api/core/jwt.py (1)

21-33: Blocking HTTP call in async context and silent failure on missing payload.

Two concerns:

  1. get_signing_key_from_jwt() (line 23) makes a synchronous HTTP call, which can block the event loop. The TODO on line 22 acknowledges this, but consider prioritizing the async migration.

  2. Line 33 returns an empty dict if "payload" is missing, which would cause validate_token to raise a credentials exception due to missing "sub". Consider raising explicitly here for clearer error attribution:

Suggested improvement
-    return decoded.get("payload", {})
+    payload = decoded.get("payload")
+    if payload is None:
+        raise ValueError("Token payload missing")
+    return payload
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@api/core/jwt.py` around lines 21 - 33, The validate_and_decode_token function
currently blocks by calling _get_jwks_client().get_signing_key_from_jwt(token)
synchronously and silently returns {} when payload is missing; change this by
using an async JWKS fetcher (replace the synchronous get_signing_key_from_jwt
call with an awaitable async client or an async wrapper so the event loop is not
blocked) and after calling jwt.decode_complete ensure you explicitly raise a
clear exception (e.g., InvalidTokenError or a custom error) if
decoded.get("payload") is None or missing required fields (like "sub") so
callers such as validate_token receive a precise error instead of an empty dict.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@api/core/security.py`:
- Around line 187-211: The token validation currently creates an
httpx.AsyncClient per request in _validate_token_uncached which breaks
connection pooling; replace that pattern with a module-level reusable client
(e.g., a _tdei_client and accessor get_tdei_client()) initialized with
base_url=settings.TDEI_BACKEND_URL and appropriate timeouts, then call
get_tdei_client().get(...) instead of creating AsyncClient in the function and
keep the existing response handling and mapping into UserInfoPGMembership
unchanged.
- Around line 134-156: The function's control flow and indentation around the
token cache are broken causing a syntax error; fix by restructuring so token
validation happens in correct order: first call validate_and_decode_token(token)
inside a try/except that raises credentials_exception on failure and extract
user_id from payload, then check the cache with if token in _token_cache: log
the cache hit and return _token_cache[token]; only after cache miss, call
user_info = await _validate_token_uncached(token, user_id, payload,
osm_db_session, task_db_session) and then set _token_cache[token] = user_info
before returning user_info; ensure credentials_exception, _token_cache,
validate_and_decode_token, and _validate_token_uncached are referenced in the
corrected blocks with proper indentation.

In `@api/main.py`:
- Around line 173-190: Remove the unused local variable authorizedWorkspace (and
its assignments) from the workspace-header handling block in the request
validation flow: locate the header check that reads
request.headers.get("X-Workspace") and the subsequent workspace_id parsing and
permission check using current_user.isWorkspaceContributor(workspace_id), then
delete the authorizedWorkspace = None initialization and the final
authorizedWorkspace = workspace_id assignment (or, if you intend to use it
later, add a clear comment explaining it is reserved for future use). Ensure
behavior and error handling (ValueError -> HTTPException and permission
HTTPException) remain unchanged.
- Around line 204-212: The forwarded headers currently include the incoming Host
header; update the header filtering so the Host is removed before calling
client.build_request (so httpx can derive it from the upstream URL). Modify the
HOP_BY_HOP_HEADERS set to include "host" (or add an explicit filter checking
k.lower() == "host") so request.headers iteration excludes Host when
constructing new_headers used by client.build_request and ensure the unchanged
symbols request.headers, HOP_BY_HOP_HEADERS, and client.build_request are the
points you change.


ℹ️ Review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: 641bb4b9-c231-4da0-b717-50b3bec8ca2b

📥 Commits

Reviewing files that changed from the base of the PR and between a6f4c8f and 10bd4e5.

📒 Files selected for processing (5)
  • api/core/config.py
  • api/core/jwt.py
  • api/core/security.py
  • api/main.py
  • api/src/workspaces/repository.py


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 4

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
api/main.py (1)

1-7: ⚠️ Potential issue | 🟡 Minor

Fix import grouping to unblock CI (isort).

contextlib is standard library and should be grouped with stdlib imports; current ordering is causing the pipeline failure.

Suggested fix
 import os
 import re
+from contextlib import asynccontextmanager
 
 import httpx
 import sentry_sdk
-from contextlib import asynccontextmanager
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@api/main.py` around lines 1 - 7, The imports are mis-grouped causing isort
failures; move the contextlib import (specifically asynccontextmanager from
contextlib) into the standard library import block with os and re so stdlib
imports are contiguous, leaving third-party imports (httpx, sentry_sdk)
separate; update the import ordering in api/main.py so that "from contextlib
import asynccontextmanager" is listed alongside "import os" and "import re".
♻️ Duplicate comments (2)
api/main.py (2)

181-182: ⚠️ Potential issue | 🟡 Minor

Remove unused authorizedWorkspace assignments.

Lines 181 and 198 assign authorizedWorkspace, but it is never read.

Suggested fix
-    authorizedWorkspace = None
-
     if request.headers.get("X-Workspace") is not None:
@@
-        authorizedWorkspace = workspace_id

Also applies to: 198-198

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@api/main.py` around lines 181 - 182, The variable authorizedWorkspace is
being assigned (e.g., "authorizedWorkspace = None") but never read; remove the
unused assignments and any declaration of authorizedWorkspace from the affected
scope (search for "authorizedWorkspace" occurrences) or, if the variable was
intended to be used, implement the intended usage; otherwise simply delete the
variable and its assignments so dead code is eliminated (ensure imports or other
references are not left dangling).

113-124: ⚠️ Potential issue | 🟠 Major

Still forwarding Host to upstream; filter it out.

Host is not excluded in HOP_BY_HOP_HEADERS, so the incoming Host header can be forwarded at lines 213-217. This can misroute upstream requests.

Suggested fix
 HOP_BY_HOP_HEADERS = frozenset(
     [
+        "host",
         "connection",
         "keep-alive",
         "proxy-authenticate",

Also applies to: 213-217

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@api/main.py` around lines 113 - 124, The proxy is still forwarding the
incoming Host header because "host" is not in HOP_BY_HOP_HEADERS; update
HOP_BY_HOP_HEADERS to include "host" (lowercase) and ensure the header-filtering
code that builds the upstream request (the block that filters/forwards headers
around the current header-forwarding logic) performs case-insensitive
comparisons against HOP_BY_HOP_HEADERS so Host is never forwarded to upstream.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@api/core/jwt.py`:
- Around line 14-16: The JWKS URL construction using
f"{settings.TDEI_OIDC_URL}realms/{settings.TDEI_OIDC_REALM}/protocol/openid-connect/certs"
can produce a malformed URL when settings.TDEI_OIDC_URL lacks a trailing slash;
update the construction in the jwt module to normalize the base URL (e.g., call
settings.TDEI_OIDC_URL.rstrip('/') or use urllib.parse.urljoin) before
concatenating the realms/{settings.TDEI_OIDC_REALM}/protocol/... path so there
is exactly one slash between the base and the path, preserving
settings.TDEI_OIDC_REALM usage.

In `@api/core/security.py`:
- Around line 204-205: Replace the direct call to UUID(user_id) with guarded
parsing: wrap UUID(user_id) in a try/except that catches ValueError (using the
same scope where r.user_uuid and payload/user_id are handled) and treat a
ValueError as an authentication failure instead of letting it propagate (e.g.,
raise the same auth/HTTP exception you use for invalid tokens); ensure you
reference r.user_uuid, UUID(user_id) and the user_id variable from the payload
and keep r.user_name assignment unchanged.
- Around line 209-213: Wrap the call to _tdei_client.get(...) (the call that
assigns response and uses user_id and headers) in a try/except that catches
httpx.RequestError and translates it into an HTTP 502 Bad Gateway error;
specifically, in the function where _tdei_client.get(...) is invoked, catch
httpx.RequestError and raise an HTTPException with status_code=502 and a concise
message (including the original exception message) so
transport/timeout/connection failures are returned as 502 instead of a generic
500.

In `@api/main.py`:
- Around line 76-83: The CORS config uses allow_origin_regex=r".*" together with
allow_credentials=True which allows credentialed requests from any origin;
update the app.add_middleware(CORSMiddleware, ...) configuration to remove the
open regex and instead supply an explicit allowlist via allow_origins (or a
tightly anchored regex) containing only trusted origins (e.g., your production
domains) and keep allow_credentials=True only when those origins are explicit;
modify the call referencing allow_origin_regex to use allow_origins=[...] (or a
secure anchored regex) and ensure allow_methods, allow_headers,
allow_credentials and max_age remain as needed.


ℹ️ Review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: bc80b69b-9547-45fb-8db0-ddf878ee37aa

📥 Commits

Reviewing files that changed from the base of the PR and between 10bd4e5 and 4e1ebb8.

📒 Files selected for processing (5)
  • api/core/config.py
  • api/core/jwt.py
  • api/core/security.py
  • api/main.py
  • api/src/workspaces/repository.py
🚧 Files skipped from review as they are similar to previous changes (1)
  • api/src/workspaces/repository.py


@coderabbitai coderabbitai bot left a comment


Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
api/main.py (1)

1-7: ⚠️ Potential issue | 🟠 Major

Fix import ordering to unblock CI.

Line 6 places a stdlib import after third-party imports, which matches the isort pipeline failure.

Proposed fix
 import os
 import re
+from contextlib import asynccontextmanager
 
 import httpx
 import sentry_sdk
-from contextlib import asynccontextmanager
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@api/main.py` around lines 1 - 7, Reorder the imports so standard library
modules come first: move "from contextlib import asynccontextmanager" up with
"import os" and "import re", followed by third-party imports "import httpx" and
"import sentry_sdk"; ensure the import groups follow the project isort style
(stdlib, third-party) to resolve the CI isort failure.
♻️ Duplicate comments (2)
api/main.py (2)

204-221: ⚠️ Potential issue | 🟠 Major

Remove unused authorizedWorkspace (ruff F841).

Lines 204 and 221 assign a local that is never read.

Proposed fix
-    authorizedWorkspace = None
-
     if request.headers.get("X-Workspace") is not None:
         try:
             workspace_id = int(request.headers.get("X-Workspace") or "-1")
         except ValueError:
             raise HTTPException(
                 status_code=status.HTTP_400_BAD_REQUEST,
                 detail="X-Workspace header must be a valid integer",
             )
@@
-        authorizedWorkspace = workspace_id
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@api/main.py` around lines 204 - 221, The local variable authorizedWorkspace
is assigned but never used (ruff F841); remove the unused declaration and all
assignments to authorizedWorkspace in this block, or if intended, replace its
usage by passing workspace_id where needed; locate the code around the
header-parsing logic that uses request.headers.get("X-Workspace") and the
current_user.isWorkspaceContributor(workspace_id) check and either drop the
authorizedWorkspace variable entirely or wire workspace_id into the subsequent
logic that requires the authorized workspace.

76-83: ⚠️ Potential issue | 🟠 Major

Tighten CORS: credentialed wildcard policy is still open.

Line 78 (allow_origin_regex=r".*") combined with line 79 (allow_credentials=True) effectively permits credentialed cross-origin reads from any origin.

Suggested direction
 app.add_middleware(
     CORSMiddleware,
-    allow_origin_regex=r".*",
+    # Use an explicit trusted-origin allowlist from config/env.
+    allow_origins=settings.CORS_ALLOWED_ORIGINS,
     allow_credentials=True,
-    allow_methods=["*"],
-    allow_headers=["*"],
+    allow_methods=["GET", "POST", "PUT", "DELETE", "OPTIONS", "HEAD", "PATCH"],
+    allow_headers=settings.CORS_ALLOWED_HEADERS,
     max_age=100,
 )
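
For contrast with the wide-open regex, a sketch of the check a tightened origin policy performs. The origins and the regex below are placeholders, not the project's real domains, and this is the decision logic only, not Starlette's implementation.

```python
import re

# Explicit allowlist plus an anchored regex: the safer alternative to
# allow_origin_regex=r".*" when allow_credentials=True.
ALLOWED_ORIGINS = {"https://workspaces.example.org"}
ALLOWED_ORIGIN_RE = re.compile(r"^https://[a-z0-9-]+\.example\.org$")

def origin_allowed(origin: str) -> bool:
    # fullmatch on an anchored pattern prevents "https://dev.example.org.evil.com"
    # style suffix tricks that a bare re.search would let through.
    return origin in ALLOWED_ORIGINS or bool(ALLOWED_ORIGIN_RE.fullmatch(origin))
```

The anchoring matters as much as the allowlist: an unanchored pattern plus credentials is nearly as open as the original r".*".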
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@api/main.py` around lines 76 - 83, The CORS setup using app.add_middleware
with CORSMiddleware currently sets allow_origin_regex=r".*" while
allow_credentials=True, which allows credentialed requests from any origin;
change this to a safe configuration by removing the permissive regex and either
(a) set allow_origins to a strict list of allowed origins (e.g., from
config/env) and keep allow_credentials only if those origins are explicit, or
(b) if you must use a pattern, replace allow_origin_regex with a specific,
tightened regex that matches only trusted domains, or (c) disable
allow_credentials (set to False) if wildcard/regex origins are required; update
the CORSMiddleware call (app.add_middleware, CORSMiddleware, allow_origin_regex,
allow_origins, allow_credentials) accordingly.

ℹ️ Review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: d7759c1e-73f0-40b7-a67b-87b30d3192c0

📥 Commits

Reviewing files that changed from the base of the PR and between 4e1ebb8 and 3ab04d4.

📒 Files selected for processing (2)
  • api/core/jwt.py
  • api/main.py
🚧 Files skipped from review as they are similar to previous changes (1)
  • api/core/jwt.py


@coderabbitai coderabbitai bot left a comment


🧹 Nitpick comments (4)
api/core/security.py (2)

231-234: Add exception chaining for JSON parsing error.

Same pattern as above: use from None to suppress the original traceback when converting the error to an auth failure.

Proposed fix
     try:
         pg_data = response.json()
     except Exception:
-        raise credentials_exception
+        raise credentials_exception from None
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@api/core/security.py` around lines 231 - 234, The JSON parsing try/except
around response.json() should chain the exception with "from None" when raising
credentials_exception to suppress the underlying traceback; update the except
block that catches Exception from response.json() to raise credentials_exception
from None so the auth failure mirrors the existing pattern (look for the
response.json() call and the credentials_exception symbol in this file).

159-162: Add exception chaining for cleaner tracebacks.

The static analyzer correctly identifies that raise ... from None should be used to suppress the original exception traceback, which is appropriate here since you're converting the original error to an auth failure.

Proposed fix
     try:
         payload = validate_and_decode_token(token)
     except Exception:
-        raise credentials_exception
+        raise credentials_exception from None
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@api/core/security.py` around lines 159 - 162, The except block that catches
errors from validate_and_decode_token(token) should suppress the original
traceback by using exception chaining: change the raise to "raise
credentials_exception from None" so the auth failure replaces the original
exception; update the try/except around validate_and_decode_token(token) to use
that form referencing the existing credentials_exception symbol.
api/main.py (2)

146-161: Consider defensive None check for _osm_client.

While the lifespan manager should guarantee _osm_client is initialized before endpoints are called, direct access to the module-level variable (line 156, 164) could raise AttributeError if accessed during shutdown or if the lifespan fails. This is a low-probability edge case but worth considering for robustness.

Optional defensive pattern
# At the start of the endpoint:
if _osm_client is None:
    raise HTTPException(
        status_code=status.HTTP_503_SERVICE_UNAVAILABLE,
        detail="OSM proxy client not initialized",
    )

Or continue without the check if you're confident the lifespan guarantees are sufficient.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@api/main.py` around lines 146 - 161, Add a defensive None-check for the
module-level _osm_client at the start of the capabilities(request: Request)
endpoint to avoid AttributeError during shutdown or failed lifespan; if
_osm_client is None, raise fastapi.HTTPException with
status_code=status.HTTP_503_SERVICE_UNAVAILABLE and a short detail like "OSM
proxy client not initialized" before any access to _osm_client.base_url.host or
similar, keeping STRIP_REQUEST_HEADERS usage unchanged.

166-177: Add exception chaining for proxy error handling.

The static analyzer flags that exceptions raised within except clauses should use from None (or from err) to clarify the exception chain. This pattern appears multiple times in the file.

Proposed fix for this block and similar patterns
     try:
         rp_resp = await _osm_client.send(rp_req, stream=True)
     except httpx.TimeoutException:
         raise HTTPException(
             status_code=status.HTTP_504_GATEWAY_TIMEOUT,
             detail="Upstream OSM service timed out",
-        )
+        ) from None
     except httpx.ConnectError:
         raise HTTPException(
             status_code=status.HTTP_502_BAD_GATEWAY,
             detail="Could not connect to upstream OSM service",
-        )
+        ) from None

Apply the same pattern to lines 208-211, 249-252, and 254-257.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@api/main.py` around lines 166 - 177, The except handlers around the proxy
call (where rp_resp = await _osm_client.send(rp_req, stream=True)) raise new
HTTPException instances without exception chaining; update the
httpx.TimeoutException and httpx.ConnectError blocks to re-raise the
HTTPException using "from err" (capture the original exception as err) or "from
None" to satisfy the static analyzer, e.g., capture the caught exception (except
httpx.TimeoutException as err) and then raise
HTTPException(status_code=status.HTTP_504_GATEWAY_TIMEOUT, detail="Upstream OSM
service timed out") from err; apply the same pattern to the other similar blocks
referenced in the comment (the handlers around lines 208-211, 249-252, 254-257)
so all raises include explicit exception chaining.

ℹ️ Review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: 75343760-6e79-483a-82e5-3216c25eeb84

📥 Commits

Reviewing files that changed from the base of the PR and between 3ab04d4 and 3d0c056.

📒 Files selected for processing (4)
  • api/core/config.py
  • api/core/jwt.py
  • api/core/security.py
  • api/main.py
🚧 Files skipped from review as they are similar to previous changes (1)
  • api/core/config.py

@cyrossignol cyrossignol merged commit e9f7e93 into main on Mar 7, 2026 (2 of 3 checks passed).
@cyrossignol cyrossignol deleted the stabilize-proxy branch on March 7, 2026 at 22:31.